Please ask about problems and questions regarding this tutorial on answers.ros.org. Don't forget to include in your question the link to this page, the versions of your OS & ROS, and also add appropriate tags.
Setup onboard cameras (IDS ueye)
Description: This tutorial shows how to use the cameras that come with the robot by default (IDS uEye): the stereo pair on the head and the camera at each hand tip. It also shows how to run AR tag recognition using these cameras. Tutorial Level: ADVANCED
Next Tutorial: rtmros_nextage/Tutorials/CalibrateKinect
Introduction
NEXTAGE OPEN (not Hironx) by default comes with four IDS uEye cameras: two on the head and one on each hand. The cameras themselves are handled by the driver package ueye_cam, and integrating them into the robot is done by nextage_ros_bridge/launch/nextage_ueye_stereo.launch and nextage_ros_bridge/launch/hands_ueye.launch.
For Hironx, camera product may be different
(Hironx owners only) The camera product on your robot may differ from the one described on this page, so the way to use it may also differ. Please identify the product and ask on the community if you need help. It would also be highly appreciated if, once you figure out the steps, you update this wiki with what you find.
Using your own cameras on HiroNXO
If you want to use cameras other than the default ones, any camera can be integrated into your HiroNXO application as long as ROS driver software exists for it. Here's a list of existing camera driver software (note: this list is maintained manually, not automatically, so it may be incomplete). Some active users mount a Kinect/Xtion on the robot's head as a stereo camera (example with Xtion). In that case you need to update or create the robot's 3D model to reflect the camera's location if your application relies on the model (e.g. MoveIt!, collision checking, or anything tf-based).
Setting head cameras
Run the ROS node for the head-mounted stereo camera
1. Run UEYE node.
First, make sure the stereo camera on the robot's head is set up (remove the lens caps etc.), then start the stereo camera node:
roslaunch nextage_ros_bridge nextage_ueye_stereo.launch
This starts nodelets that handle the uEye camera devices via the ueye_cam package. You should be able to receive the image topics that ueye_cam publishes:
rostopic echo left/image_raw
rostopic echo right/image_raw
You can also see the images by GUI tools like rqt_image_view.
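The topics above carry sensor_msgs/Image messages. As a rough illustration of how the raw buffer in such a message maps to pixels, here is a standalone sketch that needs no ROS; it uses a synthetic message, and in a real subscriber you would use cv_bridge instead of reshaping by hand.

```python
import numpy as np

def image_to_array(width, height, step, encoding, data):
    """Reshape a raw sensor_msgs/Image-style byte buffer into a numpy array.

    This is only an illustration of the buffer layout; real ROS code
    should convert with cv_bridge.
    """
    if encoding == "mono8":
        channels = 1
    elif encoding in ("rgb8", "bgr8"):
        channels = 3
    else:
        raise ValueError("unsupported encoding: %s" % encoding)
    arr = np.frombuffer(bytes(data), dtype=np.uint8)
    # Each image row occupies `step` bytes; trailing row padding is dropped.
    arr = arr.reshape(height, step)[:, : width * channels]
    if channels > 1:
        arr = arr.reshape(height, width, channels)
    return arr

# Synthetic 4x2 mono8 image (no row padding, so step == width).
img = image_to_array(4, 2, 4, "mono8", bytes(range(8)))
print(img.shape)  # (2, 4)
```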
Calibrate stereo camera
Before you start using the stereo camera, you need to run the stereo camera calibration tool (see http://wiki.ros.org/camera_calibration/Tutorials/StereoCalibration for details). Run it with the following command, then save and commit the results:
rosrun camera_calibration cameracalibrator.py --size 4x6 --square 0.024 --approximate=0.1 --camera_name=stereo_frame right:=/right/image_raw left:=/left/image_raw right_camera:=/right left_camera:=/left
Calibration data is saved under <your home directory>/.ros/camera_info, so do not delete it.
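The saved calibration file contains, among other things, the 3x3 camera matrix. The following sketch uses hypothetical values (your own calibration will differ) to show how the matrix entries relate to the pinhole projection of a 3D point onto the image plane:

```python
# Hypothetical contents of a file under ~/.ros/camera_info, shown as a
# Python dict for illustration; the actual numbers come from your own
# calibration run.
calib = {
    "image_width": 752,
    "image_height": 480,
    "camera_matrix": {"rows": 3, "cols": 3,
                      "data": [700.0, 0.0, 376.0,
                               0.0, 700.0, 240.0,
                               0.0, 0.0, 1.0]},
}

K = calib["camera_matrix"]["data"]
fx, cx = K[0], K[2]   # focal length and principal point, x (pixels)
fy, cy = K[4], K[5]   # focal length and principal point, y (pixels)

# Project a hypothetical 3D point (0.1, 0.0, 1.0) m in the camera frame
# onto the image plane with the pinhole model:
X, Y, Z = 0.1, 0.0, 1.0
u = fx * X / Z + cx
v = fy * Y / Z + cy
print(u, v)  # 446.0 240.0
```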
Confirm stereo camera calibration and see disparity image
After calibrating the stereo camera, restart the camera node to make sure it publishes calibrated stereo images:
roslaunch nextage_ros_bridge nextage_ueye_stereo.launch
rosrun image_view stereo_view stereo:=/ image:=image_raw _approximate_sync:=True
See http://wiki.ros.org/stereo_image_proc/Tutorials/ChoosingGoodStereoParameters for advice on choosing good stereo parameters.
Cameras at hand tip
The following starts the cameras on both hands.
$ roslaunch nextage_ros_bridge hands_ueye.launch
You can subscribe to the images with:
$ rostopic echo /left_hand_ueye/image_raw   (left)
$ rostopic echo /right_hand_ueye/image_raw  (right)
AR marker recognition
Run:
$ roslaunch nextage_ros_bridge ar.launch
This publishes tf values for any AR marker recognized by the camera (only markers predefined in the ar_track_alvar package).
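A tf pose consists of a translation and a quaternion rotation. In a running system you would look it up with a tf listener; as a standalone sketch of the math, the following converts a hypothetical marker pose (90 degrees about Z, 0.5 m in front of the camera) into a rotation matrix and transforms a point on the marker into the camera frame:

```python
import numpy as np

def quat_to_matrix(x, y, z, w):
    """Rotation matrix from a unit quaternion (x, y, z, w),
    the component order used by geometry_msgs/Quaternion."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Hypothetical marker pose: rotated 90 deg about Z, 0.5 m ahead of the camera.
R = quat_to_matrix(0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))
t = np.array([0.0, 0.0, 0.5])

# A point 10 cm along the marker's own x axis, expressed in the camera frame:
p_cam = R @ np.array([0.1, 0.0, 0.0]) + t
print(np.round(p_cam, 3))  # [0.  0.1 0.5]
```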